Incremental Feature Selection by Block Addition and Block Deletion Using Least Squares SVRs

Author

  • Shigeo Abe
Abstract

For a small sample problem with a large number of features, feature selection by cross-validation frequently resorts to random tie breaking because the recognition rate is a discrete criterion. This leads to inferior feature selection results. To solve this problem, we propose using a least squares support vector regressor (LS SVR) instead of a least squares support vector machine (LS SVM). We use the class labels (+1/-1) as the targets of the LS SVR and the mean absolute error estimated by cross-validation as the selection criterion. With the LS SVR, the selection and ranking criteria become continuous, and thus ties become rare. For evaluation, we use incremental block addition and block deletion of features, which was developed for function approximation. Computer experiments show that the performance of the proposed method is comparable to that obtained with a criterion based on the weighted sum of the recognition error rate and the average margin error.
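The sketch below illustrates this selection criterion under stated assumptions: it scores a candidate feature subset by the cross-validated mean absolute error of a least squares regressor fitted to the +1/-1 class labels. Kernel ridge regression is used as a stand-in for the LS SVR, and the data set, hyperparameters, and candidate subsets are hypothetical; this is not the authors' implementation.

```python
# Minimal sketch of a continuous feature-selection criterion:
# cross-validated MAE of a least squares regressor on +1/-1 targets.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=100, n_features=50, random_state=0)
y = 2 * y - 1  # encode the class labels as +1 / -1 regression targets

def mae_criterion(feature_idx, X, y, folds=5):
    """Continuous selection criterion: cross-validated MAE of the regressor."""
    model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=0.1)  # LS SVR stand-in
    scores = cross_val_score(model, X[:, feature_idx], y,
                             cv=folds, scoring="neg_mean_absolute_error")
    return -scores.mean()  # smaller is better

subset_a = list(range(0, 10))   # two hypothetical candidate subsets
subset_b = list(range(10, 20))
print(mae_criterion(subset_a, X, y), mae_criterion(subset_b, X, y))
```

Because the MAE is a continuous score, two candidate subsets rarely tie, unlike the discrete recognition rate.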


Similar articles

Kobe University Repository : Kernel

In selecting input variables by block addition and block deletion (BABD), multiple input variables are added and then deleted while keeping the cross-validation error below that obtained using all the input variables. The major problem of this method is that the selection time grows as the number of input variables increases. To alleviate this problem, in this paper, we propose incremental block addition...
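The following is a rough skeleton of the block addition and block deletion idea, not the paper's algorithm: it assumes a generic cv_error(subset) estimator (for example, a closure around the MAE criterion sketched above) and shows blocks of variables being added and then deleted while the cross-validation error is kept at or below the error obtained with all variables.

```python
def babd(all_vars, cv_error, block_size=5):
    """Sketch of block addition and block deletion over variable indices."""
    threshold = cv_error(all_vars)                           # error with all variables
    ranked = sorted(all_vars, key=lambda v: cv_error([v]))   # rank single variables
    selected = []
    # Block addition: add the best-ranked variables block by block until the
    # cross-validation error is no worse than the all-variable error.
    while ranked:
        selected += ranked[:block_size]
        ranked = ranked[block_size:]
        if cv_error(selected) <= threshold:
            break
    # Block deletion: tentatively delete one block at a time and keep the
    # deletion whenever the error stays at or below the threshold.
    i = 0
    while i < len(selected):
        trial = selected[:i] + selected[i + block_size:]
        if trial and cv_error(trial) <= threshold:
            selected = trial      # block deleted; the next block slides into place
        else:
            i += block_size       # keep this block and move to the next one
    return selected

# Hypothetical usage with the MAE criterion from the previous sketch:
# selected = babd(list(range(X.shape[1])),
#                 lambda s: mae_criterion(s, X, y), block_size=5)
```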


Theory of block-pulse functions in numerical solution of Fredholm integral equations of the second ‎kind‎

Recently, block-pulse functions (BPFs) have been used in solving electromagnetic scattering problems, which are modeled as linear Fredholm integral equations (FIEs) of the second kind. However, the theoretical aspects of this method have not been fully investigated yet. In this article, in addition to presenting a new approach for solving FIEs of the second kind, the theory of both methods is investigated as a...


Fast Variable Selection by Block Addition and Block Deletion

We propose a threshold updating method for terminating variable selection, together with two variable selection methods. In the threshold updating method, we update the threshold value whenever an approximation error smaller than the current threshold value is obtained. The first variable selection method is the combination of forward selection by block addition and backward selection by block deletion. In...
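Below is a minimal sketch of the threshold updating idea described above, an assumption about the mechanism rather than the paper's exact procedure; the names candidate_subsets, approx_error, and initial_threshold are hypothetical.

```python
def select_with_threshold_updating(candidate_subsets, approx_error, initial_threshold):
    """Keep the best subset seen so far; tighten the threshold on every improvement."""
    threshold, best = initial_threshold, None
    for subset in candidate_subsets:
        err = approx_error(subset)
        if err < threshold:       # a smaller approximation error was obtained:
            threshold = err       # update the threshold ...
            best = subset         # ... and remember the subset that achieved it
    return best, threshold
```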


Component-based Predictive and Exploratory Path Modeling and Multi-block Data Analysis

This discussion paper will focus on the predictive modeling of relationships between latent variables in a multi-block data framework. We will refer to component-based methods such as Partial Least Squares Path Modelling, Generalized Structured Component Analysis as well as to some of their recent variants and other alternatives. We will compare these approaches by paying particular attention t...


Comparison of sparse least squares support vector regressors trained in primal and dual

In our previous work, we developed sparse least squares support vector regressors (sparse LS SVRs) trained in the primal form in the reduced empirical feature space. In this paper, we develop sparse LS SVRs trained in the dual form in the empirical feature space. Namely, first the support vectors that span the reduced empirical feature space are selected by Cholesky factorization, and LS...
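As a hedged illustration of selecting such spanning vectors, the sketch below uses a greedy pivoted (incomplete) Cholesky factorization of the kernel matrix, which picks the samples whose kernel columns approximately span the empirical feature space. This is a generic example of the technique, not the authors' implementation, and the RBF kernel data in the usage lines are hypothetical.

```python
import numpy as np

def pivoted_cholesky(K, tol=1e-6):
    """Greedy pivoted Cholesky of a kernel matrix K.
    Returns the pivot indices (samples spanning the empirical feature space)
    and the factor L with K ~= L @ L.T."""
    n = K.shape[0]
    d = np.diag(K).astype(float).copy()    # residual diagonal
    L = np.zeros((n, 0))
    pivots = []
    while d.max() > tol:
        j = int(np.argmax(d))              # pick the largest residual diagonal
        col = (K[:, j] - L @ L[j, :]) / np.sqrt(d[j])   # residual kernel column
        L = np.hstack([L, col[:, None]])
        d = d - col ** 2                   # update residual diagonal
        d[j] = 0.0                         # numerical safety for the chosen pivot
        pivots.append(j)
    return pivots, L

# Hypothetical usage: RBF kernel matrix of 30 random training samples.
rng = np.random.RandomState(0)
X = rng.randn(30, 4)
sq_dist = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-0.5 * sq_dist)
support_idx, L = pivoted_cholesky(K, tol=1e-8)
```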



Journal title:

Volume   Issue

Pages  -

Publication date: 2014